How to Price Information by Kullback-Leibler Entropy and a Moment-Return Relation for Portfolios
Abstract
A connection between the notion of information and the concepts of risk and return in portfolio theory is derived. This is done in two steps: first, a general moment-return relation for arbitrary assets is established; then the total expected return is connected to the Kullback-Leibler information. With this result, the optimization problem of maximizing the expected return of a portfolio consisting of n subportfolios by moment variation under a given value-at-risk constraint is solved. This yields an ansatz for pricing information.
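The abstract does not reproduce the paper's formulas, but its two named ingredients, the Kullback-Leibler divergence between return distributions and expected-return maximization under a value-at-risk constraint, can be illustrated with a minimal sketch. The Gaussian VaR form, the sample numbers, and all names below are illustrative assumptions, not the paper's construction.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) of discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Toy discrete return scenarios: how far a believed distribution p
# sits from a reference distribution q, in nats.
print("KL:", kl_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2]))

# Illustrative inputs for n = 3 subportfolios (all numbers made up).
mu = np.array([0.05, 0.08, 0.12])            # expected returns
cov = np.array([[0.020, 0.001, 0.000],       # return covariance
                [0.001, 0.050, 0.002],
                [0.000, 0.002, 0.090]])
alpha, var_limit = 0.90, 0.10                # VaR level and loss budget

def var_gap(w):
    # Gaussian value-at-risk z_alpha * sigma - mean (a common textbook
    # form, assumed here); the weights are feasible when VaR <= var_limit.
    sigma = np.sqrt(w @ cov @ w)
    return var_limit - (norm.ppf(alpha) * sigma - w @ mu)

res = minimize(lambda w: -(w @ mu), x0=np.ones(3) / 3,
               bounds=[(0.0, 1.0)] * 3,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                            {"type": "ineq", "fun": var_gap}])
print("weights:", res.x.round(3), "expected return:", round(res.x @ mu, 4))
```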
Similar Papers
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using 106 years of real oil-price data (1913 to 2018), with attention to the asymmetry problem in filtering and forecasting. We use the DLM form of the basic Hotelling model under the quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
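As a pointer to what asymmetric loss means here, the sketch below contrasts the symmetric quadratic loss with the LINEX loss b(e^(a d) - a d - 1); the parameter values are illustrative, and the Bayesian DLM machinery of the paper is not reproduced.

```python
import numpy as np

def quadratic_loss(err):
    """Symmetric loss: over- and under-forecasting cost the same."""
    return err ** 2

def linex_loss(err, a=1.0, b=1.0):
    """LINEX loss b * (exp(a * err) - a * err - 1): approximately
    quadratic near zero, but exponential on one side and linear on the
    other, so the sign of the forecast error matters."""
    return b * (np.exp(a * err) - a * err - 1.0)

errors = np.array([-0.5, -0.1, 0.1, 0.5])    # forecast minus realization
print("quadratic:", quadratic_loss(errors))  # same loss for -0.5 and +0.5
print("LINEX:    ", linex_loss(errors))      # +0.5 penalized more than -0.5
```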
Dynamic Bayesian Information Measures
This paper introduces measures of information for Bayesian analysis when the support of the data distribution is progressively truncated. The focus is on lifetime distributions whose support is truncated at the current age t >= 0. Notions of uncertainty and information are presented and operationalized by Shannon entropy, Kullback-Leibler information, and mutual information. Dynamic updatings...
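The "dynamic" qualifier refers to entropies of residual lifetimes. Below is a minimal sketch, assuming the standard residual-entropy definition H(t) = -integral_t^inf (f(x)/S(t)) log(f(x)/S(t)) dx, which the paper's notation may refine; the exponential case makes a good check, since memorylessness keeps H(t) constant in t.

```python
import numpy as np
from scipy.integrate import quad

def residual_entropy(pdf, sf, t, upper):
    """Shannon entropy of the residual lifetime (X - t | X > t):
    H(t) = -integral_t^upper (f(x)/S(t)) * log(f(x)/S(t)) dx, with a
    finite cutoff 'upper' past which the density is negligible, to keep
    the quadrature well behaved."""
    s = sf(t)
    integrand = lambda x: -(pdf(x) / s) * np.log(pdf(x) / s)
    value, _ = quad(integrand, t, upper)
    return value

# Exponential lifetime of rate lam: the residual entropy should be
# constant, equal to 1 - log(lam), at every age t.
lam = 2.0
pdf = lambda x: lam * np.exp(-lam * x)       # density f
sf = lambda x: np.exp(-lam * x)              # survival function S
for t in (0.0, 1.0, 3.0):
    print(t, round(residual_entropy(pdf, sf, t, upper=t + 20.0), 6),
          1 - np.log(lam))
```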
Rényi Divergence and Kullback-Leibler Divergence
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibl...
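A small numerical check of the order-1 statement: the sketch below evaluates the discrete Rényi divergence D_alpha(P||Q) = log(sum_i p_i^alpha q_i^(1-alpha)) / (alpha - 1) for orders approaching 1 and compares it with the Kullback-Leibler divergence; the distributions are toy numbers.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1) between
    discrete distributions p and q."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.log(np.sum(p ** alpha * q ** (1 - alpha))) / (alpha - 1))

def kl_divergence(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

p, q = [0.2, 0.5, 0.3], [0.4, 0.4, 0.2]
for a in (0.5, 0.9, 0.99, 0.999):
    print(f"D_{a}(p||q) = {renyi_divergence(p, q, a):.6f}")
print(f"KL(p||q)    = {kl_divergence(p, q):.6f}")  # the alpha -> 1 limit
```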
Testing Exponentiality Based on Rényi Entropy of Transformed Data
In this paper, we introduce new tests for exponentiality based on estimators of the Rényi entropy of a continuous random variable. We first consider two transformations of the observations that turn the test of exponentiality into one of uniformity, and use a corresponding test based on Rényi entropy. Critical values of the test statistics are computed by Monte Carlo simulation. Then, we compare p...
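The Monte Carlo step can be sketched generically: simulate the entropy statistic under the uniform null, take a quantile as the critical value, and reject when the observed statistic falls past it. The sketch below uses the Vasicek m-spacing estimator of Shannon entropy as a stand-in for the paper's Rényi-entropy estimators, and a lower-tail rejection convention; both are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def vasicek_entropy(x, m=None):
    """Vasicek m-spacing entropy estimator: the mean of
    log(n * (X_(i+m) - X_(i-m)) / (2m)) over the order statistics,
    with indices clipped at the sample boundaries."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    m = m or max(1, int(round(np.sqrt(n))))
    lo = np.clip(np.arange(n) - m, 0, n - 1)
    hi = np.clip(np.arange(n) + m, 0, n - 1)
    return float(np.mean(np.log(n * (x[hi] - x[lo]) / (2 * m))))

def mc_critical_value(n, level=0.05, reps=5000):
    """Critical value from the simulated null distribution. The uniform
    law maximizes entropy on [0, 1], so departures show up as *small*
    entropy: reject in the lower tail."""
    stats = [vasicek_entropy(rng.uniform(size=n)) for _ in range(reps)]
    return float(np.quantile(stats, level))

n = 50
crit = mc_critical_value(n)
sample = rng.uniform(size=n)   # stand-in for the transformed observations
print("statistic:", vasicek_entropy(sample), "critical value:", crit)
print("reject exponentiality:", vasicek_entropy(sample) < crit)
```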
Information Measures Based TOPSIS Method for Multicriteria Decision Making Problem in Intuitionistic Fuzzy Environment
In fuzzy set theory, information measures play a paramount role in several areas such as decision making and pattern recognition. In this paper, a similarity measure based on the cosine function and entropy measures based on the logarithmic function are proposed for IFSs (intuitionistic fuzzy sets). Comparisons of the proposed similarity and entropy measures with existing ones are listed. Numerical results limpidly betoken th...
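For concreteness, the sketch below evaluates one standard cosine-based similarity measure for intuitionistic fuzzy sets, each element carrying a membership degree mu and a non-membership degree nu; the measure actually proposed in the paper, and its TOPSIS embedding, may differ.

```python
import numpy as np

def ifs_cosine_similarity(mu_a, nu_a, mu_b, nu_b):
    """Cosine similarity between intuitionistic fuzzy sets A and B:
    the mean over elements of the cosine of the angle between the
    vectors (mu_A, nu_A) and (mu_B, nu_B)."""
    num = mu_a * mu_b + nu_a * nu_b
    den = np.hypot(mu_a, nu_a) * np.hypot(mu_b, nu_b)
    return float(np.mean(num / den))

# Two IFSs over a three-element universe (mu + nu <= 1 per element).
A_mu, A_nu = np.array([0.7, 0.5, 0.6]), np.array([0.2, 0.3, 0.1])
B_mu, B_nu = np.array([0.6, 0.4, 0.8]), np.array([0.3, 0.4, 0.1])
print("similarity:", round(ifs_cosine_similarity(A_mu, A_nu, B_mu, B_nu), 4))
```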